Faster Eigenvector Computation via Shift-and-Invert Preconditioning

Authors

  • Dan Garber
  • Elad Hazan
  • Chi Jin
  • Sham M. Kakade
  • Cameron Musco
  • Praneeth Netrapalli
  • Aaron Sidford
Abstract

We give faster algorithms and improved sample complexities for the fundamental problem of estimating the top eigenvector. Given an explicit matrix A ∈ R^(n×d), we show how to compute an ε-approximate top eigenvector of AᵀA in time Õ([nnz(A) + d·sr(A)/gap²] · log(1/ε)). Here nnz(A) is the number of nonzeros in A, sr(A) is the stable rank, and gap is the relative eigengap. We also consider an online setting in which, given a stream of i.i.d. samples from a distribution D with covariance matrix Σ and a vector x₀ which is an O(gap) approximate top eigenvector for Σ, we show how to refine x₀ to an ε approximation using O(v(D)/(gap · ε)) samples from D. Here v(D) is a natural notion of variance. Combining our algorithm with previous work to initialize x₀, we obtain improved sample complexities and runtimes under a variety of assumptions on D. We achieve our results via a robust analysis of the classic shift-and-invert preconditioning method. This technique lets us reduce eigenvector computation to approximately solving a series of linear systems with fast stochastic gradient methods.
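As a rough illustration of the shift-and-invert reduction described in the abstract, the sketch below runs power iteration on (shift·I − AᵀA)⁻¹, applying the inverse only approximately with a few conjugate-gradient steps. The function name, the choice of shift, and all iteration counts are illustrative assumptions, not the parameters or the stochastic solver analyzed in the paper.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

def shift_invert_top_eigvec(A, shift, outer_iters=20, inner_iters=25, seed=0):
    """Approximate the top eigenvector of A^T A by power iteration on
    (shift*I - A^T A)^{-1}, using CG as an approximate inner solver.
    `shift` is assumed to lie slightly above the top eigenvalue of A^T A."""
    n, d = A.shape
    # Matrix-vector product with the shifted matrix, never forming A^T A.
    def shifted_matvec(v):
        return shift * v - A.T @ (A @ v)
    M = LinearOperator((d, d), matvec=shifted_matvec)

    rng = np.random.default_rng(seed)
    x = rng.standard_normal(d)
    x /= np.linalg.norm(x)
    for _ in range(outer_iters):
        # Inexact inverse power step: approximately solve (shift*I - A^T A) y = x.
        y, _ = cg(M, x, maxiter=inner_iters)
        x = y / np.linalg.norm(y)
    return x

# Usage: compare the Rayleigh quotient of the output with the true top eigenvalue.
A = np.random.default_rng(1).standard_normal((200, 50))
lam1 = np.linalg.eigvalsh(A.T @ A)[-1]    # exact value, used here only to pick a shift
x = shift_invert_top_eigvec(A, shift=1.1 * lam1)
print(x @ (A.T @ (A @ x)) / lam1)         # close to 1 for a good approximation
```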

Similar articles

Robust Shift-and-Invert Preconditioning: Faster and More Sample Efficient Algorithms for Eigenvector Computation

In this paper we provide faster algorithms and improved sample complexities for approximating the top eigenvector of a matrix AᵀA. In particular we give the following results for computing an approximate eigenvector, i.e. some x such that xᵀAᵀAx ≥ (1−ε)λ₁(AᵀA): • Offline Eigenvector Estimation: Given an explicit matrix A ∈ R^(n×d), we show how to compute an ε-approximate top eigenvector in time Õ([n...

Full text
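The approximation criterion quoted above, xᵀAᵀAx ≥ (1−ε)λ₁(AᵀA), is straightforward to test numerically for a candidate vector. Below is a minimal check; the helper name and the dense eigenvalue computation are my own illustrative choices and only make sense for small matrices.

```python
import numpy as np

def is_eps_approx_top_eigvec(A, x, eps):
    """Check the guarantee x^T A^T A x >= (1 - eps) * lambda_1(A^T A)
    for a unit-norm candidate x (dense check, small matrices only)."""
    x = x / np.linalg.norm(x)
    rayleigh = x @ (A.T @ (A @ x))
    lam1 = np.linalg.eigvalsh(A.T @ A)[-1]
    return rayleigh >= (1 - eps) * lam1

A = np.random.default_rng(0).standard_normal((100, 20))
# The exact top eigenvector of A^T A trivially satisfies the criterion.
v_top = np.linalg.eigh(A.T @ A)[1][:, -1]
print(is_eps_approx_top_eigvec(A, v_top, eps=0.01))  # True
```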

Efficient coordinate-wise leading eigenvector computation

We develop and analyze efficient "coordinate-wise" methods for finding the leading eigenvector, where each step involves only a vector-vector product. We establish global convergence with overall runtime guarantees that are at least as good as Lanczos's method and dominate it for a slowly decaying spectrum. Our methods are based on combining a shift-and-invert approach with coordinate-wise algorit...

Full text
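As a rough sketch of the coordinate-wise idea in the snippet above, the code below solves the shifted linear system arising in shift-and-invert by plain cyclic coordinate descent (Gauss-Seidel style), so each update costs only a vector-vector product. This is a generic stand-in, not the specific algorithm or step-size scheme of that paper.

```python
import numpy as np

def coordinate_descent_solve(M, b, sweeps=50):
    """Solve M y = b for symmetric positive definite M by cyclic coordinate
    descent: each step updates one coordinate using one row of M (a
    vector-vector product), never a full matrix-vector product."""
    d = M.shape[0]
    y = np.zeros(d)
    for _ in range(sweeps):
        for i in range(d):
            # Exact minimization of the quadratic along coordinate i.
            y[i] += (b[i] - M[i] @ y) / M[i, i]
    return y

# Example: one shift-and-invert step, solving (shift*I - A^T A) y = x coordinate-wise.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 20))
B = A.T @ A
shift = 1.1 * np.linalg.eigvalsh(B)[-1]
M = shift * np.eye(20) - B
x = rng.standard_normal(20)
y = coordinate_descent_solve(M, x)
print(np.linalg.norm(M @ y - x))  # small residual after the sweeps
```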

On Large-Scale Diagonalization Techniques for the Anderson Model of Localization

We propose efficient preconditioning algorithms for an eigenvalue problem arising in quantum physics, namely the computation of a few interior eigenvalues and their associated eigenvectors for the large, sparse, real, symmetric, and indefinite matrices of the Anderson model of localization. We compare the Lanczos algorithm in the 1987 implementation by Cullum and Willoughby with the shift-and-inv...

Full text
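Interior eigenvalues are the standard use case for shift-and-invert. For context, here is a small example using SciPy's ARPACK wrapper in shift-invert mode on a toy 1D Hamiltonian with random on-site disorder; the matrix size, disorder strength, and shift are arbitrary choices, and this is not the implementation benchmarked in that paper.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

rng = np.random.default_rng(0)
n = 2000
# 1D tight-binding Hamiltonian with random on-site disorder:
# disorder on the diagonal, hopping terms -1 on the off-diagonals.
disorder = rng.uniform(-2.0, 2.0, size=n)
H = sp.diags([disorder, -np.ones(n - 1), -np.ones(n - 1)], [0, 1, -1], format="csc")

# A few interior eigenvalues closest to the shift sigma=0, via shift-and-invert.
vals, vecs = eigsh(H, k=5, sigma=0.0, which="LM")
print(np.sort(vals))
```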

Inexact Shift-and-invert Arnoldi’s Method and Implicit Restarts with Preconditioning for Eigencomputations

We consider the computation of a few eigenvectors and corresponding eigenvalues of a large sparse nonsymmetric matrix. In order to compute eigenvalues in an isolated cluster around a given shift we apply shift-and-invert Arnoldi's method with and without implicit restarts. For the inner iterations we use GMRES as the iterative solver. The costs of the inexact solves are measured...

Full text
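A compact sketch of the idea described above: build an Arnoldi basis for (A − σI)⁻¹ while carrying out each inner solve only inexactly with GMRES. The basis size, shift, test matrix, and inner iteration cap are illustrative assumptions, and no restarting or solve-tolerance relaxation strategy from that paper is reproduced here.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import gmres

def inexact_shift_invert_arnoldi(A, sigma, m=20, inner_iters=50, seed=0):
    """Arnoldi on B = (A - sigma*I)^{-1}, where each application of B is an
    inexact GMRES solve. Returns Ritz estimates of eigenvalues of A near sigma."""
    n = A.shape[0]
    shifted = A - sigma * sp.identity(n, format="csr")
    rng = np.random.default_rng(seed)
    V = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    v0 = rng.standard_normal(n)
    V[:, 0] = v0 / np.linalg.norm(v0)
    for j in range(m):
        # Inexact inverse application: solve (A - sigma*I) w = v_j with GMRES.
        w, _ = gmres(shifted, V[:, j], maxiter=inner_iters)
        for i in range(j + 1):               # modified Gram-Schmidt
            H[i, j] = V[:, i] @ w
            w -= H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        V[:, j + 1] = w / H[j + 1, j]
    theta = np.linalg.eigvals(H[:m, :m])     # Ritz values of (A - sigma*I)^{-1}
    return sigma + 1.0 / theta               # eigenvalue estimates of A

# Toy nonsymmetric tridiagonal matrix; look for eigenvalues near the shift sigma=1.
n = 500
A = sp.diags([2.0 * np.ones(n), -1.2 * np.ones(n - 1), -0.8 * np.ones(n - 1)],
             [0, 1, -1], format="csr")
est = inexact_shift_invert_arnoldi(A, sigma=1.0)
print(est[np.argsort(np.abs(est - 1.0))[:5]])  # Ritz estimates nearest the shift
```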

Two Iterative Algorithms for Computing the Singular Value Decomposition from Input/Output Samples

The Singular Value Decomposition (SVD) is an important tool for linear algebra and can be used to invert or approximate matrices. Although many authors use "SVD" synonymously with "Eigenvector Decomposition" or "Principal Components Transform", it is important to realize that these other methods apply only to symmetric matrices, while the SVD can be applied to arbitrary nonsquare matrices. This...

Full text
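To make the distinction drawn above concrete: the SVD applies to any rectangular matrix, while the eigendecomposition route goes through the symmetric matrix AᵀA, whose eigenvalues are the squared singular values and whose eigenvectors are the right singular vectors. A small NumPy check:

```python
import numpy as np

A = np.random.default_rng(0).standard_normal((8, 3))   # nonsquare: no eigendecomposition of A itself

# SVD applies directly to the rectangular matrix.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Eigendecomposition applies to the symmetric matrix A^T A instead.
eigvals, eigvecs = np.linalg.eigh(A.T @ A)

# Squared singular values equal the eigenvalues of A^T A (both sorted descending).
print(np.allclose(s**2, eigvals[::-1]))                        # True
# Right singular vectors match eigenvectors of A^T A up to sign.
print(np.allclose(np.abs(Vt), np.abs(eigvecs[:, ::-1].T)))     # True
```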


Journal: Proceedings of the 33rd International Conference on Machine Learning, New York, NY, USA (JMLR: W&CP volume 48)

Volume 48   Issue —

Pages —

Publication year: 2016